
Narrowing the Modeling Gap: A Cluster-Ranking Approach to Coreference Resolution

Abstract

Traditional learning-based coreference resolvers operate by training the mention-pair model for determining whether two mentions are coreferent or not. Though conceptually simple and easy to understand, the mention-pair model is linguistically rather unappealing and lags far behind the heuristic-based coreference models proposed in the pre-statistical NLP era in terms of sophistication. Two independent lines of recent research have attempted to improve the mention-pair model, one by acquiring the mention-ranking model to rank preceding mentions for a given anaphor, and the other by training the entity-mention model to determine whether a preceding cluster is coreferent with a given mention. We propose a cluster-ranking approach to coreference resolution, which combines the strengths of the mention-ranking model and the entity-mention model, and is therefore theoretically more appealing than both of these models. In addition, we seek to improve cluster rankers via two extensions: (1) lexicalization and (2) incorporating knowledge of anaphoricity by jointly modeling anaphoricity determination and coreference resolution. Experimental results on the ACE data sets demonstrate the superior performance of cluster rankers to competing approaches as well as the effectiveness of our two extensions.
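The abstract describes a resolver that, for each mention, ranks the preceding clusters (rather than individual preceding mentions) and jointly decides whether the mention is anaphoric at all. The sketch below is a minimal, hypothetical illustration of that decision loop under stated assumptions, not the authors' implementation: the Mention and Cluster classes, the score_cluster stub, and the new_cluster_threshold parameter are all invented for illustration, and a simple recency heuristic stands in for the learned ranking model and joint anaphoricity determination used in the paper.

```python
# Hypothetical sketch of a greedy cluster-ranking decision loop.
# score_cluster is a placeholder for a learned cluster-level ranker;
# the threshold crudely mimics the "start a new cluster" (non-anaphoric) decision.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Mention:
    text: str
    position: int  # token index of the mention in the document


@dataclass
class Cluster:
    mentions: List[Mention] = field(default_factory=list)


def score_cluster(cluster: Cluster, mention: Mention) -> float:
    """Placeholder score for attaching `mention` to `cluster`.

    A real cluster ranker would compute cluster-level features (e.g., how the
    mention agrees with *all* mentions already in the cluster) and apply learned
    weights; here a trivial recency heuristic is used purely for illustration.
    """
    last = max(m.position for m in cluster.mentions)
    return -float(mention.position - last)


def resolve(mentions: List[Mention], new_cluster_threshold: float = -5.0) -> List[Cluster]:
    """Process mentions left to right; for each, rank all preceding clusters
    and either attach to the best-scoring one or start a new cluster."""
    clusters: List[Cluster] = []
    for m in mentions:
        best: Optional[Cluster] = None
        best_score = float("-inf")
        for c in clusters:
            s = score_cluster(c, m)
            if s > best_score:
                best, best_score = c, s
        if best is not None and best_score >= new_cluster_threshold:
            best.mentions.append(m)  # mention judged anaphoric: join best cluster
        else:
            clusters.append(Cluster(mentions=[m]))  # start a new entity
    return clusters


if __name__ == "__main__":
    doc = [Mention("Barack Obama", 0), Mention("the president", 3), Mention("he", 7)]
    for i, c in enumerate(resolve(doc)):
        print(i, [m.text for m in c.mentions])
```

In the paper, both the ranking function and the anaphoricity decision are learned jointly; the fixed threshold above merely stands in for that joint decision so the loop structure is visible.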

Bibliographic Information

  • Authors

    Rahman, Altaf; Ng, Vincent;

  • Author Affiliation
  • Year: 2014
  • Total Pages
  • Format: PDF
  • Language
  • CLC Classification
